Speed-up of error backpropagation algorithm with class-selective relevance
Authors
Abstract
Selective attention learning is proposed to improve the speed of the error backpropagation algorithm for fast speaker adaptation. A class-selective relevance measure of the importance of each hidden node in a multilayer perceptron is employed to update the weights of the network selectively, thereby reducing the computational cost of learning. © 2002 Elsevier Science B.V. All rights reserved.
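The idea above can be sketched in code. The snippet below is a minimal illustration, not the paper's exact method: it assumes relevance of a hidden node is measured as the error increase when that node is silenced, and then runs backpropagation steps that update only the weights attached to the most relevant hidden nodes. The network sizes, the top-k selection, and the per-step relevance recomputation are all illustrative assumptions (in practice relevance would likely be computed only periodically).

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer MLP (sizes are arbitrary, for illustration only).
W1 = rng.normal(scale=0.5, size=(4, 8))   # input -> hidden
W2 = rng.normal(scale=0.5, size=(8, 3))   # hidden -> output

def forward(x):
    h = np.tanh(x @ W1)
    return h, h @ W2

def relevance(x, t):
    """Relevance of each hidden node: increase in squared error when
    that node's output is zeroed (one plausible reading of
    class-selective relevance; an assumption, not the paper's formula)."""
    h, y = forward(x)
    base = np.sum((y - t) ** 2)
    rel = np.zeros(h.shape[1])
    for j in range(h.shape[1]):
        h_j = h.copy()
        h_j[:, j] = 0.0                       # silence node j
        rel[j] = np.sum((h_j @ W2 - t) ** 2) - base
    return rel

def selective_step(x, t, k=3, lr=0.01):
    """Backprop step that updates only the weights touching the k most
    relevant hidden nodes, reducing the per-step update cost."""
    sel = np.argsort(relevance(x, t))[-k:]    # top-k hidden nodes
    h, y = forward(x)
    d_out = 2 * (y - t)                       # dE/dy for squared error
    d_hid = (d_out @ W2.T) * (1 - h ** 2)     # backprop through tanh
    W2[sel, :] -= lr * h[:, sel].T @ d_out    # selected rows only
    W1[:, sel] -= lr * x.T @ d_hid[:, sel]    # selected columns only

x = rng.normal(size=(5, 4))
t = np.eye(3)[rng.integers(0, 3, size=5)]
before = np.sum((forward(x)[1] - t) ** 2)
for _ in range(50):
    selective_step(x, t)
after = np.sum((forward(x)[1] - t) ** 2)
```

Because only a subset of the weight matrix is touched per step, each update is cheaper than a full backpropagation sweep, at the cost of computing the relevance measure.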
Similar resources
Selective Attention for Robust Recognition of Noisy and Superimposed Patterns
Based on the “early selection” filter model, a new selective attention algorithm is developed to improve recognition performance for noisy patterns and superimposed patterns. The selective attention algorithm incorporates the error backpropagation rule to adapt the attention filters for a testing input pattern and an attention cue for a specific class. For superimposed test patterns an attentio...
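One way to read the mechanism described above is test-time adaptation of an input attention filter: the trained weights stay frozen, and gradient descent adjusts per-input attention gains so the output moves toward the attention cue for a hypothesized class. The sketch below is an assumption-laden toy (random frozen weights, multiplicative gains on the input, squared-error cue matching), not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

# Frozen "trained" network weights (random here, purely illustrative).
W1 = rng.normal(scale=0.5, size=(4, 6))
W2 = rng.normal(scale=0.5, size=(6, 2))

def forward(x, a):
    """Attention filter a scales the input before the frozen network."""
    h = np.tanh((a * x) @ W1)
    return h, h @ W2

def adapt_attention(x, cue, steps=200, lr=0.02):
    """Backpropagate the output error with respect to the attention
    gains a only; the network weights are never updated."""
    a = np.ones_like(x)
    for _ in range(steps):
        h, y = forward(x, a)
        d_out = 2 * (y - cue)
        d_hid = (d_out @ W2.T) * (1 - h ** 2)
        a -= lr * (d_hid @ W1.T) * x          # dE/da = dE/d(input) * x
    return a

x = rng.normal(size=(1, 4))
cue = np.array([[1.0, 0.0]])                  # cue for a hypothesized class
a = adapt_attention(x, cue)
err0 = np.sum((forward(x, np.ones_like(x))[1] - cue) ** 2)
err1 = np.sum((forward(x, a)[1] - cue) ** 2)
```

After adaptation the attended input fits the cued class better, which is the basis for comparing class hypotheses on noisy or superimposed patterns.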
A Modified Error Function for Complex-valued Backpropagation Neural Networks
The complex-valued backpropagation algorithm has been widely used in fields dealing with telecommunications, speech recognition, and image processing with Fourier transformation. However, the local minima problem usually occurs in the process of learning. To solve this problem and to speed up the learning process, we propose a modified error function. We added a term to the conventional error f...
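The snippet is cut off before it states the added term, so the sketch below only illustrates the general pattern of a modified error function. As an assumption, it adds a penalty on saturated tanh hidden activations to the conventional squared error, a common trick for keeping gradients alive and escaping flat regions; the paper's actual extra term may be different.

```python
import numpy as np

def modified_error(y, t, h, lam=0.01):
    """Conventional squared error plus an added term.
    Assumption: the extra term penalizes saturated tanh hidden
    activations (h close to +/-1), weighted by lam."""
    mse = 0.5 * np.mean((y - t) ** 2)
    saturation = np.mean(h ** 2)      # approaches 1 as tanh saturates
    return mse + lam * saturation

# Toy usage: output 0.2 vs. target 0.0, one nearly saturated hidden unit.
y = np.array([0.2])
t = np.array([0.0])
h = np.array([0.99])
e = modified_error(y, t, h)
```

The gradient of the added term pushes hidden units back toward their linear range, where backpropagated error signals are larger.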
A scaled conjugate gradient algorithm for fast supervised learning
A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. The algorithm is based upon a class of optimization techniques well known in numerical analysis as the Conjugate Gradient Methods. SCG uses second order information from the neural network but requires only O(N) memory usage, where N is the number of weights in the network. The perf...
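The O(N) memory claim rests on a standard trick also used by SCG: the Hessian-vector product needed by conjugate-gradient methods can be estimated from two gradient evaluations, without ever forming the N x N Hessian. The sketch below shows that trick in isolation on a quadratic, using a numerical gradient as a stand-in for backpropagation; it is an illustration of the idea, not an SCG implementation.

```python
import numpy as np

def grad(f, w, eps=1e-6):
    """Central-difference gradient (stand-in for backprop; O(N) storage)."""
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (f(w + e) - f(w - e)) / (2 * eps)
    return g

def hessian_vector(f, w, p, sigma=1e-4):
    """Approximate H @ p from two gradient calls:
    (grad(w + sigma*p) - grad(w)) / sigma, never forming H."""
    return (grad(f, w + sigma * p) - grad(f, w)) / sigma

# Quadratic test function f(w) = 0.5 w^T A w, whose Hessian is exactly A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda w: 0.5 * w @ A @ w
w = np.array([1.0, -1.0])
p = np.array([0.5, 2.0])
hp = hessian_vector(f, w, p)   # should approximate A @ p
```

SCG additionally scales this estimate with a Levenberg-Marquardt-style parameter to keep the search direction well conditioned, which is what removes the line search.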
Comparison Results Between Usual Backpropagation and Modified Backpropagation with Weighting: Application to Radar Detection
This paper presents some relevant results of a novel variant of the backpropagation algorithm applied during the multilayer perceptron learning phase. The novelty consists of a weighting operation when the MLP learns the weights. The purpose is to modify the mean square error objective, giving more relevance to less frequent training patterns and reducing the relevance of frequent ones. Th...
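A simple way to realize such a weighting, sketched below under the assumption that pattern weights are set inversely proportional to class frequency (the paper's exact scheme is not shown in the snippet): rare patterns, such as radar targets, then dominate the objective instead of being drowned out by frequent clutter.

```python
import numpy as np

def class_weights(labels):
    """Per-pattern weights inversely proportional to class frequency
    (an assumed form of the weighting, for illustration)."""
    classes, counts = np.unique(labels, return_counts=True)
    freq = dict(zip(classes, counts / labels.size))
    return np.array([1.0 / freq[c] for c in labels])

def weighted_mse(y, t, w):
    """Mean square error with per-pattern relevance weights."""
    return np.mean(w * (y - t) ** 2)

labels = np.array([0, 0, 0, 0, 1])   # class 1 (e.g. target) is rare
w = class_weights(labels)            # rare class gets the larger weight
```

During learning, each pattern's backpropagated error is simply scaled by its weight, so the change to standard backpropagation is minimal.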
Novel fast learning algorithms for time-delay neural networks
To counter the drawback that Waibel's time-delay neural networks (TDNN) require a long training time in phoneme recognition, the paper puts forward several improved fast learning methods for TDNN. Merging unsupervised Oja's rule and the similar error backpropagation algorithm for initial training of TDNN weights can effectively increase convergence speed, while the error function almost m...
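Oja's rule, the unsupervised component mentioned above, is a Hebbian update with a built-in decay that keeps the weight norm near 1 and drives the weight vector toward the first principal component of the inputs, giving backpropagation a better starting point. The sketch below demonstrates the rule itself on synthetic 2-D data; the data, learning rate, and single-unit setting are illustrative assumptions, not the paper's TDNN configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def oja_update(w, x, eta=0.005):
    """Oja's rule: Hebbian term eta*y*x with decay eta*y^2*w that
    normalizes w, converging toward the top principal component."""
    y = w @ x
    return w + eta * y * (x - y * w)

# Synthetic data stretched along the direction [1, 1] / sqrt(2).
X = rng.normal(size=(2000, 2)) * np.array([3.0, 0.3])
R = np.array([[1.0, -1.0], [1.0, 1.0]]) / np.sqrt(2)  # rotate axis to [1, 1]
X = X @ R.T

w = rng.normal(size=2)
w /= np.linalg.norm(w)
for x in X:
    w = oja_update(w, x)

principal = np.array([1.0, 1.0]) / np.sqrt(2)
alignment = abs(w @ principal)   # close to 1 once converged
```

Initializing the first TDNN layer this way aligns it with the dominant input structure before supervised backpropagation begins.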
Journal: Neurocomputing
Volume: 48
Pages: -
Publication year: 2002